Towards deep learning with spiking neurons in energy based models with contrastive Hebbian plasticity
Authors
Abstract
In machine learning, error back-propagation in multi-layer neural networks (deep learning) has been impressively successful in supervised and reinforcement learning tasks. As a model for learning in the brain, however, deep learning has long been regarded as implausible, since it relies in its basic form on a non-local plasticity rule. To overcome this problem, energy-based models with local contrastive Hebbian learning were proposed and tested on a classification task with networks of rate neurons. We extended this work by implementing and testing such a model with networks of leaky integrate-and-fire neurons. Preliminary results indicate that it is possible to learn a non-linear regression task with hidden layers, spiking neurons and a local synaptic plasticity rule.
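A minimal sketch of the contrastive Hebbian learning scheme described in the abstract, using rate neurons rather than the leaky integrate-and-fire neurons of the actual model: the network is relaxed once with the output free and once with the output clamped to the target, and each weight is updated from the difference of its local pre/post correlations between the two phases. The layer sizes, hard-sigmoid rate function, toy regression task and all hyperparameters below are illustrative assumptions, not taken from the paper.

```python
# Sketch: contrastive Hebbian learning with rate neurons on a toy
# non-linear regression task (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 1, 30, 1            # assumed layer sizes
W1 = rng.normal(0, 0.5, (n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (n_hid, n_out))  # hidden -> output weights

def rho(u):
    """Hard-sigmoid rate function, keeps rates in [0, 1]."""
    return np.clip(u, 0.0, 1.0)

def relax(x, y_clamp=None, steps=50, dt=0.2):
    """Relax hidden and output rates to a fixed point.
    The input is always clamped; the output is clamped only in the
    second ("teaching") phase."""
    h = np.zeros(n_hid)
    y = np.zeros(n_out) if y_clamp is None else y_clamp.copy()
    for _ in range(steps):
        h += dt * (-h + rho(x @ W1 + y @ W2.T))  # bottom-up + top-down drive
        if y_clamp is None:
            y += dt * (-y + rho(h @ W2))
    return h, y

# Toy regression target: y = (sin(3x) + 1) / 2 for x in [0, 1]
X = rng.uniform(0, 1, (200, n_in))
Y = 0.5 * (np.sin(3 * X) + 1)

lr = 0.05
for epoch in range(100):
    for x, y_target in zip(X, Y):
        h_free, y_free = relax(x)                # free phase
        h_cl, y_cl = relax(x, y_clamp=y_target)  # clamped phase
        # Local contrastive Hebbian rule: clamped minus free correlations
        W1 += lr * (np.outer(x, h_cl) - np.outer(x, h_free))
        W2 += lr * (np.outer(h_cl, y_cl) - np.outer(h_free, y_free))

preds = np.array([relax(x)[1] for x in X])
print("train MSE:", float(np.mean((preds - Y) ** 2)))
```

The feedback weights are taken as the transpose of the feedforward weights, the usual symmetry assumption of contrastive Hebbian learning in layered energy-based networks.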
Similar resources
Models of Metaplasticity: A Review of Concepts
Part of hippocampal and cortical plasticity is characterized by synaptic modifications that depend on the joint activity of the pre- and post-synaptic neurons. To which extent those changes are determined by the exact timing and the average firing rates is still a matter of debate; this may vary from brain area to brain area, as well as across neuron types. However, it has been robustly observe...
Functional Implications of Synaptic Spike Timing Dependent Plasticity and Anti-Hebbian Membrane Potential Dependent Plasticity
Recent extensions of the Perceptron as the Tempotron and the Chronotron suggest that this theoretical concept is highly relevant for understanding networks of spiking neurons in the brain. It is not known, however, how the computational power of the Perceptron might be accomplished by the plasticity mechanisms of real synapses. Here we prove that spike-timing-dependent plasticity ...
Event-driven contrastive divergence for spiking neuromorphic systems
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissi...
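For context, below is a minimal sketch of the standard CD-1 update for a Bernoulli restricted Boltzmann machine, i.e. the conventional learning rule that event-driven spiking variants build on. The layer sizes, learning rate and random toy data are illustrative assumptions, not taken from that paper.

```python
# Sketch: one-step contrastive divergence (CD-1) for a Bernoulli RBM.
import numpy as np

rng = np.random.default_rng(1)
n_vis, n_hid = 20, 10                      # assumed layer sizes
W = rng.normal(0, 0.1, (n_vis, n_hid))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.05):
    """Positive phase on the data, negative phase after one Gibbs step;
    the weight change is the difference of the two correlations."""
    global W, b_vis, b_hid
    ph0 = sigmoid(v0 @ W + b_hid)                 # P(h = 1 | v0)
    h0 = (rng.random(n_hid) < ph0).astype(float)  # sample hidden states
    pv1 = sigmoid(h0 @ W.T + b_vis)               # reconstruct visibles
    v1 = (rng.random(n_vis) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b_hid)
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b_vis += lr * (v0 - v1)
    b_hid += lr * (ph0 - ph1)

# Toy binary data: random sparse patterns
data = (rng.random((500, n_vis)) < 0.2).astype(float)
for epoch in range(10):
    for v in data:
        cd1_update(v)
```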
Perfect Associative Learning with Spike-Timing-Dependent Plasticity
Recent extensions of the Perceptron as the Tempotron and the Chronotron suggest that this theoretical concept is highly relevant for understanding networks of spiking neurons in the brain. It is not known, however, how the computational power of the Perceptron might be accomplished by the plasticity mechanisms of real synapses. Here we prove that spike-timing-dependent plasticity having an anti...
Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks
Synaptic Plasticity, the foundation for learning and memory formation in the human brain, manifests in various forms. Here, we combine the standard spike timing correlation based Hebbian plasticity with a non-Hebbian synaptic decay mechanism for training a recurrent spiking neural model to generate sequences. We show that inclusion of the adaptive decay of synaptic weights with standard STDP he...
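A minimal sketch of a pair-based STDP update combined with a non-Hebbian exponential weight decay, in the spirit of the combination described above. The amplitudes, time constants and decay rate are illustrative assumptions, not values from that paper.

```python
# Sketch: pair-based STDP plus an activity-independent weight decay.
import numpy as np

A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0   # STDP time constants (ms)
decay_rate = 1e-4                  # non-Hebbian decay rate (per ms)

def stdp_dw(delta_t):
    """Weight change for one pre/post spike pair.
    delta_t = t_post - t_pre (ms); positive values potentiate."""
    if delta_t >= 0:
        return A_plus * np.exp(-delta_t / tau_plus)
    return -A_minus * np.exp(delta_t / tau_minus)

def update_weight(w, pre_spikes, post_spikes, interval=100.0):
    """Apply pair-based STDP over all spike pairs, then a slow
    activity-independent decay toward zero over the interval (ms)."""
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            w += stdp_dw(t_post - t_pre)
    w *= np.exp(-decay_rate * interval)  # non-Hebbian decay
    return np.clip(w, 0.0, 1.0)          # keep the weight bounded

# Example: causally ordered pre -> post pairings potentiate the synapse
w = update_weight(0.5, pre_spikes=[10.0, 50.0], post_spikes=[15.0, 55.0])
print("weight after pairing:", w)
```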
Journal: CoRR
Volume: abs/1612.03214
Pages: -
Publication date: 2016